An increasing-angle property of the conjugate gradient method and the implementation of large-scale minimization algorithms with line searches

Authors

  • Yu-Hong Dai
  • José Mario Martínez
  • Jin Yun Yuan
Abstract

The search direction in unconstrained minimization algorithms for large-scale problems is usually computed as an iterate of the (preconditioned) conjugate gradient method applied to the minimization of a local quadratic model. In line-search procedures this direction is required to satisfy an angle condition, which says that the angle between the negative gradient at the current point and the direction is bounded away from π/2. In this paper it is shown that the angle between conjugate gradient iterates and the negative gradient strictly increases as the conjugate gradient algorithm proceeds. Therefore, interrupting the conjugate gradient sub-algorithm when the angle condition does not hold is theoretically justified.
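
To make the interruption rule concrete, the sketch below runs conjugate gradient iterations on the local quadratic model q(d) = gᵀd + ½ dᵀHd (i.e. it solves Hd = -g) and stops as soon as a new iterate violates the angle condition, returning the last acceptable iterate. The function name, tolerances, and the fallback to the steepest-descent direction are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def cg_direction_with_angle_check(H, g, eps_angle=1e-3, tol=1e-8, max_iter=None):
    """Run CG on the quadratic model (solve H d = -g) and return the last
    iterate that still satisfies the angle condition
        cos(angle(d, -g)) = -g.T d / (||g|| ||d||) >= eps_angle.
    Illustrative sketch only; names and stopping rules are assumptions."""
    n = g.size
    g_norm = np.linalg.norm(g)
    if g_norm == 0.0:
        return np.zeros(n)            # already stationary
    if max_iter is None:
        max_iter = n
    d = np.zeros(n)                   # current CG iterate (candidate direction)
    r = -g.copy()                     # residual of H d = -g at d = 0
    p = r.copy()                      # CG conjugate direction
    best = None
    for _ in range(max_iter):
        Hp = H @ p
        pHp = p @ Hp
        if pHp <= 0:                  # nonpositive curvature: stop (simplified handling)
            break
        alpha = (r @ r) / pHp
        d_new = d + alpha * p
        # Angle condition: interrupt once the angle with -g gets too close
        # to pi/2, i.e. its cosine drops below eps_angle.
        cos_angle = -(g @ d_new) / (g_norm * np.linalg.norm(d_new))
        if cos_angle < eps_angle:
            break                     # keep the previous (acceptable) iterate
        d, best = d_new, d_new
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) <= tol * g_norm:
            break                     # quadratic model solved accurately enough
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return best if best is not None else -g   # fallback: steepest descent
```

In a full minimization loop, the returned vector would then be handed to a line search (for example, one enforcing the Wolfe conditions) exactly as with any other descent direction.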

Similar Articles

A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations

The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems owing to its low storage requirements and simple implementation. Research on its application to higher-dimensional systems of nonlinear equations is just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear equations...
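
For context, a generic textbook form of a three-term conjugate gradient direction is shown below; it augments the classical two-term update with one extra term. The scalars and the choice of the third vector vary between methods, so this is not necessarily the exact update of the cited paper.

```latex
% Generic three-term CG direction; g_k is the gradient (or residual),
% y_{k-1} = g_k - g_{k-1}, and beta_{k-1}, theta_{k-1} are method-specific scalars.
d_0 = -g_0, \qquad
d_k = -g_k + \beta_{k-1} d_{k-1} + \theta_{k-1} y_{k-1}, \quad k \ge 1 .
```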

A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on an eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search method. The global...
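
The sufficient descent condition referred to here is the standard uniform bound below; the constant c > 0 is generic, not a value taken from the cited paper.

```latex
% Sufficient descent: the directional derivative along d_k is negative and
% bounded away from zero relative to \|g_k\|^2, uniformly in k.
g_k^{\top} d_k \le -c\,\|g_k\|^{2} \quad \text{for all } k, \qquad c > 0 .
```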

Comparison the Sensitivity Analysis and Conjugate Gradient algorithms for Optimization of Opening and Closing Angles of Valves to Reduce Fuel Consumption in XU7/L3 Engine

In this study, the results and convergence rates of the sensitivity analysis and conjugate gradient algorithms are compared with the aim of reducing fuel consumption and increasing engine performance by optimizing the opening and closing timing of the valves in the XU7/L3 engine. Given the strength and accuracy of the GT-POWER simulation software in research on internal combustion engine...

Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property

Using the search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are performed on a set of CUTEr unconstrained optimization...
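
For reference, the classical (unmodified) Hestenes-Stiefel and Polak-Ribiere-Polyak update parameters, which the cited modifications start from, are the standard formulas:

```latex
% Two-term CG direction d_{k+1} = -g_{k+1} + beta_k d_k with y_k = g_{k+1} - g_k:
\beta_k^{\mathrm{HS}}  = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^{2}} .
```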

An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method

Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.


Journal:
  • Numerical Lin. Alg. with Applic.

Volume 10, Issue:

Pages: -

Publication year: 2003